
    Lazy Approaches for Interval Timing Correlation of Sensor Data Streams

    We propose novel algorithms for the timing correlation of streaming sensor data. The sensor data are assumed to carry interval timestamps so that they can represent temporal uncertainties. The proposed algorithms support efficient timing correlation for various timing predicates such as deadline, delay, and within. In addition to the classical techniques, lazy evaluation and a result cache are utilized to improve algorithm performance. The proposed algorithms are implemented and compared under various workloads.
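
    As a loose sketch of the ideas above (not the paper's actual algorithms), the Python below represents interval timestamps, evaluates a "within" timing predicate that must hold under every timestamp realization, and defers plus memoizes evaluations in a result cache; the names Event, within, and CorrelationCache are invented for illustration.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Event:
            stream: str
            lo: float  # earliest possible occurrence time
            hi: float  # latest possible occurrence time

        def within(a: Event, b: Event, bound: float) -> bool:
            """b occurs within `bound` time units after a, for every
            timestamp realization consistent with the intervals."""
            return b.lo - a.hi >= 0.0 and b.hi - a.lo <= bound

        class CorrelationCache:
            """Result cache: predicate evaluations are deferred until a
            result is requested (lazy evaluation), then memoized."""
            def __init__(self, bound: float):
                self.bound = bound
                self._memo = {}

            def correlate(self, a: Event, b: Event) -> bool:
                key = (a, b)
                if key not in self._memo:  # evaluate only on first request
                    self._memo[key] = within(a, b, self.bound)
                return self._memo[key]

        cache = CorrelationCache(bound=5.0)
        e1 = Event("temp", lo=0.0, hi=1.0)
        e2 = Event("humidity", lo=2.0, hi=4.0)
        print(cache.correlate(e1, e2))  # True: e2 follows e1 within 5 units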

    Design of an E-SCM System for an Auto-parts Company

    This paper deals with the design of an e-SCM system for the supply chain of a commercial vehicle company. Two first-tier vendors and two second-tier vendors of the company are included in the scope of the e-SCM system development. We analyzed the information flow and business processes between the first-tier and second-tier vendors. The database (DB) has to be designed so that its contents are easy to access, correct, and update, because the DB is indispensable for developing the new system. A relational database concept was used to design the tables that maintain the data needed in the system. The paper uses a data flow diagram to describe the TO-BE model of the e-SCM system.
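
    A hypothetical sketch of the relational-table idea described above, using SQLite from Python; the table and column names (vendor, part, purchase_order) are invented, since the abstract does not give the actual e-SCM schema.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE vendor (
            vendor_id   INTEGER PRIMARY KEY,
            name        TEXT NOT NULL,
            tier        INTEGER NOT NULL CHECK (tier IN (1, 2))  -- 1st/2nd vendor
        );
        CREATE TABLE part (
            part_id     INTEGER PRIMARY KEY,
            description TEXT NOT NULL
        );
        CREATE TABLE purchase_order (
            order_id    INTEGER PRIMARY KEY,
            vendor_id   INTEGER NOT NULL REFERENCES vendor(vendor_id),
            part_id     INTEGER NOT NULL REFERENCES part(part_id),
            quantity    INTEGER NOT NULL,
            due_date    TEXT NOT NULL  -- ISO-8601 date string
        );
        """)
        conn.execute("INSERT INTO vendor VALUES (1, 'First Vendor A', 1)")
        conn.execute("INSERT INTO part VALUES (10, 'Brake assembly')")
        conn.execute("INSERT INTO purchase_order VALUES (100, 1, 10, 250, '2024-01-15')")

        # Join the normalized tables back together, as an e-SCM view would.
        for row in conn.execute("""
            SELECT v.name, p.description, o.quantity
            FROM purchase_order o
            JOIN vendor v ON v.vendor_id = o.vendor_id
            JOIN part   p ON p.part_id   = o.part_id
        """):
            print(row)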

    Harnessing the power of diffusion models for plant disease image augmentation

    Introduction The challenges associated with data availability, class imbalance, and the need for data augmentation are well recognized in the field of plant disease detection. The collection of large-scale datasets for plant diseases is particularly demanding due to seasonal and geographical constraints, leading to significant cost and time investments. Traditional data augmentation techniques, such as cropping, resizing, and rotation, have been largely supplanted by more advanced methods. In particular, the use of Generative Adversarial Networks (GANs) for the creation of realistic synthetic images has become a focal point of contemporary research, addressing issues related to data scarcity and class imbalance in the training of deep learning models. Recently, the emergence of diffusion models has captivated the scientific community, offering superior and more realistic output compared to GANs. Despite these advancements, the application of diffusion models in the domain of plant science remains an unexplored frontier, presenting an opportunity for groundbreaking contributions. Methods In this study, we delve into the principles of diffusion technology, contrasting its methodology and performance with state-of-the-art GAN solutions, specifically examining a guided-inference GAN model, InstaGAN, and a diffusion-based model, RePaint. Both models utilize segmentation masks to guide the generation process, albeit with distinct principles. For a fair comparison, a subset of the PlantVillage dataset is used, containing two disease classes of tomato leaves and three disease classes of grape leaves, as results on these classes have been published elsewhere. Results Quantitatively, RePaint demonstrated superior performance over InstaGAN, with an average Fréchet Inception Distance (FID) score of 138.28 and a Kernel Inception Distance (KID) score of 0.089 ± 0.002, compared to InstaGAN's average FID and KID scores of 206.02 and 0.159 ± 0.004, respectively. Additionally, RePaint's FID score for grape leaf diseases was 69.05, outperforming other published methods such as DCGAN (309.376), LeafGAN (178.256), and InstaGAN (114.28). For tomato leaf diseases, RePaint achieved an FID score of 161.35, surpassing methods such as WGAN (226.08), SAGAN (229.7233), and InstaGAN (236.61). Discussion This study offers valuable insights into the potential of diffusion models for data augmentation in plant disease detection, paving the way for future research in this promising field.
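
    For reference, a minimal sketch of the Fréchet Inception Distance used in the comparison above, computed from two sets of Inception feature vectors; the toy features below are random stand-ins, not features of plant images.

        import numpy as np
        from scipy import linalg

        def fid(real_feats: np.ndarray, fake_feats: np.ndarray) -> float:
            """FID = ||mu_r - mu_f||^2 + Tr(C_r + C_f - 2 (C_r C_f)^{1/2}),
            with per-set feature means mu and covariances C."""
            mu_r, mu_f = real_feats.mean(axis=0), fake_feats.mean(axis=0)
            cov_r = np.cov(real_feats, rowvar=False)
            cov_f = np.cov(fake_feats, rowvar=False)
            covmean, _ = linalg.sqrtm(cov_r @ cov_f, disp=False)
            covmean = covmean.real  # drop tiny imaginary parts from numerics
            diff = mu_r - mu_f
            return float(diff @ diff + np.trace(cov_r + cov_f - 2.0 * covmean))

        rng = np.random.default_rng(0)
        real = rng.normal(size=(500, 64))           # stand-in for Inception features
        fake = rng.normal(loc=0.1, size=(500, 64))  # slightly shifted distribution
        print(fid(real, fake))  # small value: the two sets are nearly identical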

    Deep learning-based classification with improved time resolution for physical activities of children

    Background The proportion of overweight and obese people has increased tremendously in a short period, culminating in a worldwide trend of obesity that is reaching epidemic proportions. Overweight and obesity are serious issues, especially with regard to children, because obese children have twice the risk of becoming obese as adults, compared to non-obese children. Nowadays, many methods for maintaining a caloric balance exist; however, these methods are not applicable to children. In this study, a new approach for helping children monitor their activities using a convolutional neural network (CNN) is proposed, which is applicable to real-time scenarios requiring high accuracy. Methods A total of 136 participants (86 boys and 50 girls), aged between 8.5 and 12.5 years (mean 10.5, standard deviation 1.1), took part in this study. The participants performed various movements while wearing custom-made three-axis accelerometer modules around their waists. The data acquired by the accelerometer modules were preprocessed by dividing them into small windows (128 sample points per 2.8 s). Approximately 183,600 data samples were used by the developed CNN to learn to classify ten physical activities: slow walking, fast walking, slow running, fast running, walking up the stairs, walking down the stairs, jumping rope, standing up, sitting down, and remaining still. Results The developed CNN classified the ten activities with an overall accuracy of 81.2%. When similar activities were merged, leading to seven merged activities, the CNN classified them with an overall accuracy of 91.1%. Activity merging also improved the performance indicators, by up to 66.4% in recall, 48.5% in precision, and 57.4% in F1 score. The developed CNN classifier was compared to conventional machine learning algorithms such as the support vector machine (SVM), decision tree (DT), and k-nearest neighbor (kNN) algorithms, and the proposed CNN classifier performed the best: CNN (81.2%) > SVM (64.8%) > DT (63.9%) > kNN (55.4%) for the ten activities; CNN (91.1%) > SVM (74.4%) > DT (73.2%) > kNN (65.3%) for the merged seven activities. Discussion The developed algorithm distinguished physical activities with improved time resolution using short-time acceleration signals from the physical activities performed by children. This study involved algorithm development, participant recruitment, IRB approval, custom design of a data acquisition module, and data collection. The self-selected moving speeds for walking and running (slow and fast) and the structure of the staircase degraded the performance of the algorithm; however, after similar activities were merged, the effects caused by the self-selection of speed were reduced. The experimental results show that the proposed algorithm performed better than conventional algorithms. Owing to its simplicity, the proposed algorithm could be applied to real-time applications.
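
    A minimal sketch of a 1-D CNN for such windows (128 samples × 3 axes, ten classes) is given below in PyTorch; the layer sizes are assumptions, as the abstract does not specify the paper's exact architecture.

        import torch
        import torch.nn as nn

        class ActivityCNN(nn.Module):
            def __init__(self, n_classes: int = 10):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv1d(3, 16, kernel_size=5, padding=2),   # (N, 16, 128)
                    nn.ReLU(),
                    nn.MaxPool1d(2),                              # (N, 16, 64)
                    nn.Conv1d(16, 32, kernel_size=5, padding=2),  # (N, 32, 64)
                    nn.ReLU(),
                    nn.MaxPool1d(2),                              # (N, 32, 32)
                )
                self.classifier = nn.Linear(32 * 32, n_classes)

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                # x: (batch, 3 accelerometer axes, 128 samples)
                h = self.features(x)
                return self.classifier(h.flatten(1))

        model = ActivityCNN()
        windows = torch.randn(8, 3, 128)  # batch of 2.8-s windows
        logits = model(windows)           # (8, 10) class scores
        print(logits.shape)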

    A systematic review of progress on hepatocellular carcinoma research over the past 30 years: a machine-learning-based bibliometric analysis

    Introduction Research on hepatocellular carcinoma (HCC) has grown so significantly that researchers can no longer keep pace with the vast amount of literature. This study aimed to explore the progress of HCC research over the past 30 years using a machine-learning-based bibliometric analysis and to suggest future research directions. Methods A comprehensive search was conducted for publications between 1991 and 2020 in the public version of the PubMed database using the MeSH term “hepatocellular carcinoma.” The complete records of the collected results were downloaded in Extensible Markup Language format, and the metadata of each publication, such as the publication year, the type of research, the corresponding author’s country, the title, the abstract, and the MeSH terms, were analyzed. We adopted a latent Dirichlet allocation topic-modeling method on the Python platform to analyze the research topics of the scientific publications. Results Over the last 30 years, there has been significant and constant growth in annual publications about HCC (annual percentage growth rate: 7.34%). Overall, 62,856 articles related to HCC from the past 30 years were retrieved and included in this study. Among the diagnosis-related terms, “Liver Cirrhosis” was the most studied; however, in the 2010s, “Biomarkers, Tumor” began to outpace “Liver Cirrhosis.” Regarding the treatment-related MeSH terms, “Hepatectomy” was the most studied; however, recent studies related to “Antineoplastic Agents” showed a tendency to supersede hepatectomy. Regarding basic research, studies of “Cell Line, Tumor” appeared after 2000 and have since been the most studied among these terms. Conclusion This was the first machine-learning-based bibliometric study to analyze more than 60,000 publications about HCC over the past 30 years. Despite significant efforts in analyzing the literature on basic research, its connection with the clinical field is still lacking. Therefore, more effort is needed to translate basic research results into clinical treatment. Additionally, it was found that microRNAs have potential as diagnostic and therapeutic targets for HCC.
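
    A rough sketch of the latent Dirichlet allocation step described above, using scikit-learn on a toy stand-in corpus; the study's actual pipeline and hyperparameters are not specified in the abstract.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        abstracts = [  # stand-in documents, not the 62,856 real records
            "liver cirrhosis and hepatocellular carcinoma risk",
            "hepatectomy outcomes for hepatocellular carcinoma",
            "tumor biomarkers for early carcinoma diagnosis",
            "antineoplastic agents in advanced liver cancer",
        ]
        counts = CountVectorizer(stop_words="english").fit(abstracts)
        X = counts.transform(abstracts)  # document-term count matrix

        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

        terms = counts.get_feature_names_out()
        for k, topic in enumerate(lda.components_):
            top = [terms[i] for i in topic.argsort()[::-1][:4]]
            print(f"topic {k}: {top}")  # top words characterizing each topic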

    Coupling effects on turning points of infectious disease epidemics in scale-free networks

    Background A pandemic is a typical spreading phenomenon that can be observed in human society and is dependent on the structure of the social network. The Susceptible-Infective-Recovered (SIR) model describes spreading phenomena using two spreading factors: contagiousness (β) and recovery rate (γ). Several network models attempt to reflect the social network, but its real structure is difficult to uncover. Methods We developed a spreading-phenomenon simulator that takes epidemic parameters and network parameters as input, and we performed disease propagation experiments with it. The simulation results were analyzed to construct the distribution of a new marker, VRTP. We also derived the VRTP formula for three mathematical network models. Results We suggest the new marker VRTP (value of recovered at the turning point) to describe the coupling between SIR spreading and the scale-free (SF) network, and we observe aspects of the coupling effects for various spreading and network parameters. We also derive the analytic formulation of VRTP in the fully mixed model, the configuration model, and the degree-based model, respectively, in closed mathematical form, providing insight into the relationship between experimental simulation and theoretical consideration. Conclusions We discover the coupling effect between SIR spreading and the SF network by devising the novel marker VRTP, which reflects the shifting effect and relates to entropy.
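
    A toy sketch of the setup above: an SIR simulation on a scale-free (Barabási–Albert) network that reads off the number of recovered nodes at the infection peak, i.e. a VRTP-style marker; the parameter values are illustrative, not the paper's.

        import random
        import networkx as nx

        random.seed(0)
        G = nx.barabasi_albert_graph(n=2000, m=3)  # scale-free network
        beta, gamma = 0.05, 0.1                    # contagiousness, recovery rate

        state = {v: "S" for v in G}
        for v in random.sample(list(G), 5):        # seed infections
            state[v] = "I"

        history = []
        while any(s == "I" for s in state.values()):
            new_state = dict(state)
            for v, s in state.items():
                if s == "I":
                    for u in G[v]:                 # try to infect neighbors
                        if state[u] == "S" and random.random() < beta:
                            new_state[u] = "I"
                    if random.random() < gamma:    # recover
                        new_state[v] = "R"
            state = new_state
            history.append((sum(s == "I" for s in state.values()),
                            sum(s == "R" for s in state.values())))

        infected, recovered = zip(*history)
        turning_point = infected.index(max(infected))  # infection peak
        print("recovered at turning point (VRTP-style):",
              recovered[turning_point])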

    Network Analysis to Identify the Risk of Epidemic Spreading

    Several epidemics, such as the Black Death and the Spanish flu, have threatened human life throughout history; however, it is unclear whether humans will remain safe from the sudden and fast spread of epidemic diseases. Moreover, the transmission characteristics of epidemics remain poorly understood. In this study, we present the results of an epidemic simulation experiment revealing the relationship between epidemic parameters and pandemic risk. To analyze the time-dependent risk and impact of epidemics, we considered two parameters for infectious diseases: the recovery time from infection and the transmission rate of the disease. Based on the epidemic simulation, we identified two important aspects of human safety with regard to the threat of a pandemic. First, humans should be safe if the fatality rate is below 100%. Second, even when the fatality rate is 100%, humans would be safe if the average degree of human social networks is below a threshold value. Nevertheless, certain diseases can potentially infect all nodes in a human social network, and these diseases cause a pandemic when the average degree is larger than the threshold value. These results indicate that certain infectious diseases could lead to human extinction but can be prevented by minimizing human contact.
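
    A back-of-the-envelope sketch of the average-degree-threshold idea above, using the standard branching criterion for SIR outbreaks on networks; the choice of formula and the parameter values are assumptions, not taken from the paper.

        import networkx as nx

        def outbreak_expected(G: nx.Graph, beta: float, gamma: float) -> bool:
            """Outbreak takes off roughly when per-edge transmissibility
            times the mean excess degree exceeds 1 (Newman's criterion)."""
            degrees = [d for _, d in G.degree()]
            k1 = sum(degrees) / len(degrees)                 # <k>
            k2 = sum(d * d for d in degrees) / len(degrees)  # <k^2>
            T = beta / (beta + gamma)                        # transmissibility
            return T * (k2 - k1) / k1 > 1.0

        sparse = nx.erdos_renyi_graph(2000, 0.001, seed=1)   # <k> ~ 2
        dense = nx.erdos_renyi_graph(2000, 0.005, seed=1)    # <k> ~ 10
        print(outbreak_expected(sparse, beta=0.2, gamma=0.5))  # False: below threshold
        print(outbreak_expected(dense, beta=0.2, gamma=0.5))   # True: above threshold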

    Rare Earth Elements and Other Critical Metals in Deep Seabed Mineral Deposits: Composition and Implications for Resource Potential

    The critical metal contents of four types of seabed mineral resources, including a deep-sea sediment deposit, are evaluated to assess their potential as rare earth element (REE) resources. The deep-sea resources have relatively low total rare earth oxide (TREO) contents, a narrow range of TREO grades (0.049–0.185%), and show characteristics that are consistent with those of land-based ion adsorption REE deposits. The relative REO distributions of the deep-seabed resources are also consistent with those of ion adsorption REE deposits on land. REEs that are not part of a crystal lattice of host minerals within deep-sea mineral deposits are favorable for mining, as there is no requirement for crushing and/or pulverizing during ore processing. Furthermore, low concentrations of Th and U reduce the risk of adverse environmental impacts. Despite the low TREO grades of the deep-seabed mineral deposits, a significant TREO yield from polymetallic nodules and REE-bearing deep-sea sediments from the Korean tenements has been estimated (1 Mt and 8 Mt, respectively). Compared with land-based REE deposits, deep-sea mineral deposits can be considered low-grade mineral deposits with a large tonnage. The REEs and critical metals from deep-sea mineral deposits are important by-products and co-products of the main commodities (e.g., Co and Ni), and may increase the economic feasibility of their extraction.

    A skeletal Sr/Ca record preserved in Dipsastraea (Favia) speciosa and implications for coral Sr/Ca thermometry in mid-latitude regions

    A core (900 mm long) of the scleractinian coral Dipsastraea (Favia) speciosa was collected from Iki Island (~33°48′N), Japan, one of the highest-latitude coral reefs known to exist at present, where the winter monthly mean sea surface temperature (SST) drops to 13°C. The Sr/Ca profile was constructed using a bulk sampling method for the uppermost 280 mm interval of the core, which grew between 1966 and 2007, to test whether it could act as a suitable proxy for SST in a harsh environmental setting where reef-building corals do not usually survive. The Sr/Ca-SST relationship derived from the annual Sr/Ca and SST extremes predicted the observed monthly averaged summer SST extremes within an error range of ±1.1°C (1 s.d., n = 40). The obtained Sr/Ca-SST calibration was also found to be valid for subtropical Dipsastraea (Favia) corals, proving its broad applicability. However, low-amplitude winter peaks were observed in the slow-growing intervals, which we confirmed (using individual spot analyses along a continuous growth line) result from the mixing of theca grown at different times. Our bulk sampling approach, across multiple growth lines in the skeleton of D. (F.) speciosa, led to the mixing of asynchronous skeletal parts. At the study site, D. (F.) speciosa grows continuously, even during the cold season, suggesting that the skeletal Sr/Ca obtained from specimens of D. (F.) speciosa can be used as an SST proxy in the northwest Pacific marginal seas.
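
    An illustrative sketch of this kind of Sr/Ca-SST calibration: a linear fit of skeletal Sr/Ca against SST, then inversion of the fit to reconstruct SST; the data points below are synthetic, not measurements from the core.

        import numpy as np

        rng = np.random.default_rng(42)
        sst = rng.uniform(13.0, 29.0, size=40)               # observed SST extremes (°C)
        # Assumed linear relationship plus analytical noise (mmol/mol):
        sr_ca = 10.5 - 0.06 * sst + rng.normal(0, 0.02, 40)

        slope, intercept = np.polyfit(sst, sr_ca, deg=1)     # Sr/Ca = a*SST + b
        print(f"Sr/Ca = {slope:.4f} * SST + {intercept:.3f}")

        reconstructed = (sr_ca - intercept) / slope          # invert the calibration
        residual_sd = np.std(reconstructed - sst, ddof=1)
        print(f"prediction error ~ ±{residual_sd:.2f} °C (1 s.d., n = {len(sst)})")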